Title: Subspace segmentation with a minimal squared Frobenius norm
Authors
Abstract
We introduce a novel subspace segmentation method called Minimal Squared Frobenius Norm Representation (MSFNR). MSFNR performs data clustering by solving a convex optimization problem. We theoretically prove that in the noiseless case, MSFNR is equivalent to the classical Factorization approach and always classifies data correctly. In the noisy case, we show that on both synthetic and real-world datasets, MSFNR is much faster than most state-of-the-art methods while achieving comparable segmentation accuracy.
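Since the abstract does not state the exact objective, the following is only a minimal sketch, assuming a common squared-Frobenius-norm self-representation of the data (min_Z ||X - XZ||_F^2 + lam * ||Z||_F^2, which has a closed-form solution) followed by spectral clustering on the induced affinity. The function name, the parameter lam, and this particular objective are illustrative assumptions, not necessarily the paper's MSFNR formulation.

```python
import numpy as np
from scipy.linalg import solve
from sklearn.cluster import SpectralClustering

def msfnr_like_segmentation(X, n_clusters, lam=1e-2):
    """Sketch of a squared-Frobenius-norm self-representation segmentation.

    X          : (d, n) data matrix, one sample per column
    n_clusters : number of subspaces (assumed known)
    lam        : regularization weight (hypothetical parameter)
    """
    n = X.shape[1]
    # Self-representation with a squared Frobenius norm penalty:
    #     min_Z ||X - X Z||_F^2 + lam * ||Z||_F^2
    # whose closed-form minimizer is Z = (X^T X + lam I)^{-1} X^T X.
    G = X.T @ X
    Z = solve(G + lam * np.eye(n), G)
    # Symmetrize the coefficients into an affinity and cluster spectrally.
    W = 0.5 * (np.abs(Z) + np.abs(Z.T))
    return SpectralClustering(n_clusters=n_clusters,
                              affinity='precomputed').fit_predict(W)
```

In a formulation like this, the representation step reduces to a single n-by-n linear solve, which is one plausible source of the speed advantage the abstract claims, although the actual MSFNR algorithm may be organized differently.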
Similar resources
Inverse subspace problems with applications
Given a square matrix A, the inverse subspace problem is concerned with determining a closest matrix to A with a prescribed invariant subspace. When A is Hermitian, the closest matrix may be required to be Hermitian. We measure distance in the Frobenius norm and discuss applications to Krylov subspace methods for the solution of large-scale linear systems of equations and eigenvalue problems as...
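The entry above is a truncated abstract; as a small illustration of the plain (non-Hermitian) case it describes, the sketch below computes the Frobenius-nearest matrix to A whose invariant subspaces include a prescribed span(V), using the standard observation that invariance of span(V) is equivalent to the block mapping span(V) into its orthogonal complement being zero. The function name and interface are hypothetical, and this is not necessarily the algorithm of the cited paper.

```python
import numpy as np

def nearest_with_invariant_subspace(A, V):
    """Frobenius-nearest matrix to A having span(V) as an invariant subspace.

    A : (n, n) real matrix
    V : (n, k) full-column-rank matrix spanning the prescribed subspace
    (Illustrates only the unconstrained case, not the Hermitian variant.)
    """
    k = V.shape[1]
    # Orthogonal basis: the first k columns span range(V), the rest its complement.
    Q, _ = np.linalg.qr(V, mode='complete')
    U, W = Q[:, :k], Q[:, k:]
    # span(V) is invariant under B exactly when W^T B U = 0; zeroing that block
    # of A (written in the Q basis) gives the closest such matrix in Frobenius
    # norm, because the norm is unitarily invariant and the blocks decouple.
    return A - W @ (W.T @ A @ U) @ U.T
```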
A generalization of the Moore-Penrose inverse related to matrix subspaces of Cn×m
A natural generalization of the classical Moore-Penrose inverse is presented. The so-called S-Moore-Penrose inverse of a m × n complex matrix A, denoted by AS, is defined for any linear subspace S of the matrix vector space Cn×m. The S-Moore-Penrose inverse AS is characterized using either the singular value decomposition or (for the nonsingular square case) the orthogonal complements with resp...
Comments on “Is the Frobenius Matrix Norm Induced?”
In “Is the Frobenius Matrix Norm Induced?”, the authors ask whether the Frobenius and the norms are induced. There, they claimed that the Frobenius norm is not induced and, consequently, conjectured that the norm may not be induced. In this note, it is shown that the Frobenius norm is induced on particular matrix spaces. It is then shown that the norm is in fact induced on a particular matrix-v...
Fast Spectral Low Rank Matrix Approximation
In this paper, we study the subspace embedding problem and obtain the following results: 1. We extend the results of approximate matrix multiplication from the Frobenius norm to the spectral norm. Assume matrices A and B have stable rank at most r and rank at most r̃, respectively. Let S be a subspace embedding matrix with l rows which depends on the stable rank; then with high probability, we have ‖ASSB−A...
Linear discriminant analysis using rotational invariant L1 norm
Linear discriminant analysis (LDA) is a well-known scheme for supervised subspace learning. It has been widely used in the applications of computer vision and pattern recognition. However, an intrinsic limitation of LDA is the sensitivity to the presence of outliers, due to using the Frobenius norm to measure the inter-class and intra-class distances. In this paper, we propose a novel rotationa...
Journal:
Volume / Issue:
Pages: -
Publication date: 2012